Philosophical Logic by John P. Burgess

Author: Burgess, John P.
ISBN: 9781400830497
Publisher: Princeton University Press


4.3 THE PROBABILISTIC THEORY OF INDICATIVE CONDITIONALS

The rival theory involves the notion of probability. If we are considering truth-functional formulas involving the first k sentence letters, then there are 2^k possible models or valuations. For present purposes, a probability function π may be defined as simply an assignment of a nonnegative number c_V to each such valuation V, with the property that the sum of the c_V is one. The probability π(A) of a formula A is then the sum of the c_V for the V that make A true, and the uncertainty 1 − π(A) of A is the sum of the c_V for the V that make A false.
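The definition can be made concrete with a short sketch. The function and variable names below (valuations, probability, uncertainty, weights) are my own illustrative choices, not notation from the text; the weights play the role of the numbers c_V.

```python
from itertools import product

def valuations(k):
    """All 2^k truth-value assignments to the sentence letters p_1, ..., p_k."""
    return list(product([True, False], repeat=k))

def probability(formula, weights):
    """pi(A): sum the weights c_V over the valuations V that make A true."""
    return sum(c for V, c in weights.items() if formula(V))

def uncertainty(formula, weights):
    """1 - pi(A): the total weight of the valuations that make A false."""
    return 1 - probability(formula, weights)

# Two sentence letters, uniform weights c_V = 1/4 (one illustrative choice).
weights = {V: 0.25 for V in valuations(2)}
p = lambda V: V[0]
q = lambda V: V[1]
print(probability(p, weights))  # 0.5
```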

Various elementary laws of the probability calculus found in the first pages of primers on the subject, such as the law π(A ∨ B) = π(A) + π(B) − π(A ∧ B), follow easily from this definition. Notably, tautologies have probability one, countertautologies have probability zero, and tautologically equivalent formulas have the same probability.
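The law just cited can be checked numerically for an arbitrary probability function; the sketch below (names my own) draws a random assignment of weights to the four valuations of two letters, normalizes it to sum to one, and confirms that π(A ∨ B) = π(A) + π(B) − π(A ∧ B).

```python
import random
from itertools import product

# A random probability function over the 2^2 = 4 valuations of two letters:
# nonnegative weights c_V normalized to sum to one.
Vs = list(product([True, False], repeat=2))
raw = [random.random() for _ in Vs]
total = sum(raw)
weights = {V: r / total for V, r in zip(Vs, raw)}

def pi(formula):
    return sum(c for V, c in weights.items() if formula(V))

A = lambda V: V[0]
B = lambda V: V[1]

lhs = pi(lambda V: A(V) or B(V))
rhs = pi(A) + pi(B) - pi(lambda V: A(V) and B(V))
assert abs(lhs - rhs) < 1e-12  # pi(A v B) = pi(A) + pi(B) - pi(A ^ B)
```

The identity holds for every probability function, not just this one, since each c_V is counted the same number of times on both sides.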

Generalizing this last, let us call an argument with truth-functional premises and conclusion probabilistically valid iff for every probability function the uncertainty of its conclusion is no more than the sum of the uncertainties of its premises, and probabilistically countervalid iff the sum of the uncertainties of the premises can be made as low as desired and the uncertainty of the conclusion as high as desired by suitable choice of probability function. Then probabilistic validity and countervalidity coincide with logical validity and invalidity.

Proof. Consider the argument from A_1, …, A_k to B. For any probability function, the uncertainty of B is the sum of the c_V for each V that makes B false, while the sum of the uncertainties of the A_i is the sum over all i of the sum of the c_V for each V that makes A_i false. If the argument is valid, every V that makes B false makes at least one of the A_i false, so every c_V appearing in the former sum appears at least once in the latter sum, and hence the former sum is no greater than the latter. But if the argument is invalid, there is a W that makes all the A_i true and B false, and for the probability function that sets c_V = 1 for V = W and c_V = 0 for V ≠ W, the sum of the uncertainties of the premises is zero and the uncertainty of the conclusion is one.
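The point-mass construction in the second half of the proof can be exhibited directly. The sketch below (my own, with an invalid argument of my choosing: A does not entail A ∧ B) finds a valuation W making the premise true and the conclusion false, concentrates all the weight there, and confirms that the premise then has uncertainty zero while the conclusion has uncertainty one.

```python
from itertools import product

Vs = list(product([True, False], repeat=2))
A = lambda V: V[0]
B = lambda V: V[1]

premise = A
conclusion = lambda V: A(V) and B(V)  # invalid: A does not entail A ^ B

# Find a counterexample valuation W and set c_W = 1, c_V = 0 elsewhere.
W = next(V for V in Vs if premise(V) and not conclusion(V))
weights = {V: (1.0 if V == W else 0.0) for V in Vs}

def uncertainty(f):
    return 1 - sum(c for V, c in weights.items() if f(V))

print(uncertainty(premise))     # 0.0
print(uncertainty(conclusion))  # 1.0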

The probability of B conditional on A is given by π(B | A) = π(A ∧ B)/π(A). In general it is not equal to the probability of the material conditional, given by π(A ⊃ B) = 1 − π(A ∧ ¬B). If we write a, b, and c for the probabilities of A ∧ B, A ∧ ¬B, and ¬A, then π(B | A) = a/(a + b) while π(A ⊃ B) = 1 − b = a + c. We always have π(B | A) ≤ π(A ⊃ B), so whenever π(B | A) is high, π(A ⊃ B) will be high, but the converse need not hold.
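The gap between the two quantities is easy to illustrate with numbers of my own choosing: make ¬A very probable and B improbable given A, and the material conditional comes out near certain while the conditional probability stays low.

```python
# a, b, c are the probabilities of A ^ B, A ^ ~B, and ~A; they sum to one.
a, b, c = 0.01, 0.04, 0.95

cond_prob = a / (a + b)  # pi(B | A)  = 0.2
material = 1 - b         # pi(A > B)  = a + c = 0.96

assert cond_prob <= material  # pi(B | A) <= pi(A > B) always holds
print(cond_prob, material)
```

Here π(A ⊃ B) is 0.96, high, yet π(B | A) is only 0.2, so a high material-conditional probability gives no guarantee of a high conditional probability.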


